This paper addresses the self-heating effect of resistance sensors during temperature measurement, which gives rise to temperature measurement errors. The aim of this work was to develop a method for in situ assessment of the thermal resistance between a self-heating thermometer and the surrounding environment whose temperature is being measured. The proposed method is used to assess the uncertainty resulting from heat transfer from the thermometer to the surrounding environment, which allows the measurement accuracy to be increased. The method consists of experimentally determining the sensor’s temperature characteristic as a function of heating power for different values of the measuring current. Sample measurements were carried out on a representative group of resistance temperature sensors. The dependence of the internal thermal resistance on the sensor design, and of the external thermal resistance on the ambient conditions, was demonstrated. The developed method allows the measuring current of a resistance temperature sensor to be selected according to its design, mounting method, and environmental conditions, ensuring that measurement errors remain appropriately low.
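As a minimal sketch of the general idea behind such an assessment (not the paper’s actual procedure or data), the indicated temperature can be modeled as a linear function of the dissipated power, T(P) = T_true + R_th·P, with measurements taken at several measuring currents. The fitted slope then estimates the total thermal resistance between the sensing element and its environment, and the intercept estimates the zero-power temperature. All currents, temperatures, and sensor parameters below are illustrative assumptions.

```python
import numpy as np

# Hypothetical measurement series: for each measuring current I (A),
# the steady-state indicated temperature T (deg C) of a Pt100-type sensor.
# Values are illustrative only, not data from the paper.
currents = np.array([0.5e-3, 1e-3, 2e-3, 5e-3])            # measuring currents
temperatures = np.array([25.002, 25.008, 25.032, 25.200])  # indicated temperatures

R0 = 100.0       # nominal resistance at 0 deg C (Pt100), ohms
alpha = 3.85e-3  # temperature coefficient of a Pt100 element, 1/K

# Power dissipated in the sensing element, P = I^2 * R(T),
# using the linear approximation R(T) = R0 * (1 + alpha * T).
resistance = R0 * (1.0 + alpha * temperatures)
power = currents**2 * resistance

# Linear self-heating model T(P) = T_true + R_th * P:
# slope -> total (internal + external) thermal resistance,
# intercept -> zero-power temperature of the measured medium.
r_th, t_true = np.polyfit(power, temperatures, 1)

print(f"Estimated thermal resistance: {r_th:.1f} K/W")
print(f"Extrapolated zero-power temperature: {t_true:.3f} deg C")

# Self-heating error expected for a candidate measuring current:
i_sel = 1e-3
err = r_th * i_sel**2 * R0 * (1.0 + alpha * t_true)
print(f"Self-heating error at {i_sel*1e3:.1f} mA: {err*1e3:.2f} mK")
```

With such an estimate of the thermal resistance in the actual mounting and ambient conditions, the measuring current can be chosen so that the predicted self-heating error stays below the required limit.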